information à la source - translation to English


Source information rate

information à la source      
n. inside information

Definition

a la carte
[ɑː lɑːˈkɑːt]
adjective (of a menu) listing food that can be ordered as separate items, rather than part of a set meal.
Origin
C19: Fr., lit. 'according to the (menu) card'.

Wikipedia

Entropy rate

In the mathematical theory of probability, the entropy rate or source information rate of a stochastic process is, informally, the time density of the average information in a stochastic process. For stochastic processes with a countable index, the entropy rate $H(X)$ is the limit of the joint entropy of $n$ members of the process $X_k$ divided by $n$, as $n$ tends to infinity:

$$H(X) = \lim_{n\to\infty} \frac{1}{n} H(X_1, X_2, \dots, X_n)$$

when the limit exists. An alternative, related quantity is:

$$H'(X) = \lim_{n\to\infty} H(X_n \mid X_{n-1}, X_{n-2}, \dots, X_1)$$

For strongly stationary stochastic processes, $H(X) = H'(X)$. The entropy rate can be thought of as a general property of stochastic sources; this is the asymptotic equipartition property. The entropy rate may be used to estimate the complexity of stochastic processes. It is used in diverse applications, ranging from characterizing the complexity of languages and blind source separation to optimizing quantizers and data compression algorithms. For example, a maximum entropy rate criterion may be used for feature selection in machine learning.
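To make the two limits concrete, here is a minimal sketch (not from the article) that computes both quantities for a stationary two-state Markov chain; the transition matrix and its values are arbitrary assumptions chosen for illustration. For such a chain the conditional entropy $H(X_n \mid X_{n-1})$ does not depend on $n$, so $H'(X)$ reduces to a stationary-weighted average of the row entropies of the transition matrix, and the chain rule gives the joint entropy exactly for every $n$: $H(X_1, \dots, X_n) = H(\pi) + (n-1)\,H'(X)$.

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability vector (zeros ignored)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Assumed two-state transition matrix (rows sum to 1); the values are
# illustrative only.
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])

# Stationary distribution pi: the left eigenvector of P for eigenvalue 1.
evals, evecs = np.linalg.eig(P.T)
pi = np.real(evecs[:, np.argmax(np.real(evals))])
pi = pi / pi.sum()

# H'(X) = sum_i pi_i * H(P[i, :]): the conditional entropy of the next
# symbol given the current one, taken under the stationary distribution.
rate = sum(pi[i] * entropy(P[i]) for i in range(len(pi)))
print(f"H'(X) = {rate:.6f} bits/symbol")

# Chain rule for a stationary Markov chain:
#   H(X_1, ..., X_n) = H(pi) + (n - 1) * H'(X),
# so (1/n) * H(X_1, ..., X_n) converges to H'(X) as n grows.
for n in (1, 10, 100, 1000):
    joint = entropy(pi) + (n - 1) * rate
    print(f"n = {n:4d}: (1/n) H(X_1..X_n) = {joint / n:.6f}")
```

The per-symbol joint entropy printed in the loop approaches $H'(X)$, illustrating the equality $H(X) = H'(X)$ for this stationary source.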